
Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks

Neural Information Processing Systems

Spiking neural networks (SNNs) are well suited for spatio-temporal learning and implementations on energy-efficient event-driven neuromorphic processors. However, existing SNN error backpropagation (BP) methods lack proper handling of spiking discontinuities and suffer from low performance compared with the BP methods for traditional artificial neural networks. In addition, a large number of time steps are typically required to achieve decent performance, leading to high latency and rendering spike-based computation unscalable to deep architectures. We present a novel Temporal Spike Sequence Learning Backpropagation (TSSL-BP) method for training deep SNNs, which breaks down error backpropagation across two types of inter-neuron and intra-neuron dependencies and leads to improved temporal learning precision. TSSL-BP efficiently trains deep SNNs within a much shortened temporal window of a few steps while improving the accuracy for various image classification datasets including CIFAR10.
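To make the two dependency types concrete, the sketch below simulates a single leaky integrate-and-fire (LIF) neuron. This is an illustrative toy model, not the paper's exact formulation: the Euler-style leak update, the hard reset, and the parameter names `tau` and `v_th` are assumptions chosen for clarity. Inter-neuron dependencies arise because presynaptic spikes enter as input current and shape the membrane potential at later steps; intra-neuron dependencies arise because a spike resets the neuron's own state, affecting all of its subsequent potentials.

```python
# Illustrative sketch only: a minimal leaky integrate-and-fire (LIF)
# neuron showing the two dependency paths that TSSL-BP-style methods
# distinguish. The update rule and parameters are assumptions for
# illustration, not the paper's exact neuron model.

def simulate_lif(input_current, tau=2.0, v_th=1.0):
    """Run a LIF neuron over the input sequence; return spikes and the
    membrane-potential trace."""
    v = 0.0
    spikes, trace = [], []
    for i_t in input_current:
        # Leaky integration (inter-neuron path: presynaptic spikes
        # arrive as input current and influence v at later steps).
        v = v * (1.0 - 1.0 / tau) + i_t
        if v >= v_th:
            spikes.append(1)
            # Hard reset (intra-neuron path: a spike at time t alters
            # the neuron's own state, and hence all later potentials).
            v = 0.0
        else:
            spikes.append(0)
        trace.append(v)
    return spikes, trace

spikes, trace = simulate_lif([0.6, 0.6, 0.6, 0.6, 0.6])
print(spikes)  # constant input drives the neuron to threshold at step 3
```

With a constant sub-threshold input, the membrane potential accumulates until a spike fires, after which the reset restarts accumulation; a gradient-based method must account for both how inputs built up that potential and how the reset propagated the spike's effect forward in time.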




Review for NeurIPS paper: Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks

Neural Information Processing Systems

Weaknesses: Some points that need clarification: 1) The authors argue that their approach is better because it allows neural networks to respond within very few time steps. The results are reported after 5 time steps. I am not sure whether, for such a fast response, the dynamics of the network play any role in this case. For example, training on MNIST requires the correct output neuron to emit a spike in the second time step. It seems there is just one volley of activity passing instantaneously through the network.


Review for NeurIPS paper: Temporal Spike Sequence Learning via Backpropagation for Deep Spiking Neural Networks

Neural Information Processing Systems

The reviewers all agreed that this is an excellent paper that makes a clear contribution to the spiking NN literature. There were some concerns, but these were largely addressed in the rebuttal.

